A Linear Combination of Classifiers via Rank Margin Maximization
Authors
Abstract
The method we present builds a weighted linear combination of already trained dichotomizers, where the weights are chosen to maximize the minimum rank margin of the resulting ranking system. This is particularly suited to real applications where key parameters such as costs and priors are difficult to determine exactly; in such cases ranking is needed rather than classification. A ranker can be seen as a more basic system than a classifier, since it orders the samples according to the value the classifier assigns to each of them. Experiments on popular benchmarks, along with a comparison with other typical rankers, show how effective the approach can be.
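The abstract does not give the optimization details, but a natural formulation of "weights that maximize the minimum rank margin" is a linear program: over the simplex of weights w, maximize the smallest weighted score difference between any positive and any negative sample. The sketch below (a plain LP with `scipy.optimize.linprog`; the function name and the simplex constraint are assumptions, not taken from the paper) illustrates that idea for a bipartite ranking problem.

```python
import numpy as np
from scipy.optimize import linprog

def max_min_rank_margin(scores_pos, scores_neg):
    """Sketch: find simplex weights w maximizing the minimum pairwise
    rank margin  min_{i,j}  w . (s_i^+ - s_j^-),  where each row of
    scores_pos / scores_neg holds one sample's base-classifier scores."""
    m = scores_pos.shape[1]
    # One row per (positive, negative) pair: the score-difference vector.
    diffs = (scores_pos[:, None, :] - scores_neg[None, :, :]).reshape(-1, m)
    # Variables: [w_1 .. w_m, t].  Maximize t  <=>  minimize -t.
    c = np.zeros(m + 1)
    c[-1] = -1.0
    # t <= diffs @ w for every pair, written as  -diffs @ w + t <= 0.
    A_ub = np.hstack([-diffs, np.ones((diffs.shape[0], 1))])
    b_ub = np.zeros(diffs.shape[0])
    # Weights on the simplex: sum(w) = 1, w >= 0; t is free.
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[m]  # combination weights, achieved minimum margin
```

On a toy problem with two base classifiers, the LP can return a mixture whose minimum pairwise margin exceeds that of either classifier alone, which is the motivation for combining rankers rather than picking the single best one.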
Similar Resources
Weighted Order Statistic Classifiers with Large Rank-Order Margin
We investigate how stack filter function classes like weighted order statistics can be applied to classification problems. This leads to a new design criterion for linear classifiers when inputs are binary-valued and weights are positive. We present a rank-based measure of margin that is directly optimized as a standard linear program and investigate its relationship to regularization. Our appro...
A Convex Lower Bound for the Real l2 Parametric Stability Margin of Linear Control Systems With Restricted Complexity Controllers
In this paper the problem of restricted complexity stability margin maximization (RCSMM) for single-input singleoutput (SISO) plants affected by rank one real perturbations is considered. This problem amounts to maximizing the real l2 parametric stability margin over an assigned class of restricted complexity controllers, which are described by rational transfer functions of fixed order with co...
A method of combining multiple probabilistic classifiers through soft competition on different feature sets
A novel method is proposed for combining multiple probabilistic classifiers on different feature sets. In order to achieve the improved classification performance, a generalized finite mixture model is proposed as a linear combination scheme and implemented based on radial basis function networks. In the linear combination scheme, soft competition on different feature sets is adopted as an auto...
Ho-Kashyap with Early Stopping Versus Soft Margin SVM for Linear Classifiers - An Application
In a classification problem, hard margin SVMs tend to minimize the generalization error by maximizing the margin. Regularization is obtained with soft margin SVMs, which improve performance by relaxing the constraints on the margin maximization. This article shows that comparable performance can be obtained in the linearly separable case with the Ho–Kashyap learning rule associated with early st...